5 research outputs found

    Desarrollo de herramientas de apoyo para comparación forense de firmas manuscritas

    In this project, a software application including forensic support tools for the comparison of handwritten signatures captured with digital devices is studied and developed. Specifically, tools are designed with functionalities that exploit the information captured by these devices (e.g. pressure, velocity, inclination, etc.), offering great help to forensic handwriting examiners (FHEs) when they carry out an expert comparison.

The first part of the project is dedicated to reviewing the literature in order to become familiar with the different resources and tools currently available for biometric analysis in general and for signatures in particular. Next, the methodology traditionally followed by FHEs to analyse handwritten signatures on paper is studied, with the objective of implementing, as faithfully as possible, software tools that emulate these methodologies for digital signatures, and even improving them and implementing new ones.

The support tools are built with the Matlab Graphical User Interface (GUI). The project consists of a multi-window visual application in which the following functionalities are implemented. The first window allows several signatures to be visualized simultaneously over the same representation centre, together with the displacements in X and Y and the variation of pressure over time. Several operations can also be applied to the signatures to allow a simpler comparison, such as rotation, normalization, interpolation, etc. Once one of the signatures is chosen, it can be analysed individually in the second window, which offers further functionalities such as replaying the execution of the signature, observing the pen-up trajectory (the set of samples captured while the stylus is not touching the device), and viewing the signature with different colour scales according to the pressure applied. Finally, isolated strokes of the signature can be selected for a more exhaustive analysis, or strokes from two different signatures can be compared in the last window. The third window visualizes other temporal functions of the chosen strokes, such as velocity, acceleration, displacement in X and Y, and pressure. It is also possible to calculate angles and distances of the strokes with respect to the horizontal axis, to activate zoom, to visualize the samples of a stroke, to calculate the area it occupies in inches or centimetres, and to rotate and displace it. Finally, functionalities covering temporal fluency, spatial tremor, average velocity and acceleration, and duration are included. These serve as further assistance for the FHEs to determine the authenticity of a signature, producing a score in the range [0-10] together with population statistics that indicate whether the obtained score belongs to a genuine or a forged signature.

To this end, a series of Matlab algorithms have been developed whose results vary according to input parameters that must be optimized so that they discriminate as well as possible between genuine and forged signatures. To obtain these optimal parameters, several databases of signatures captured with different devices are used: Wacom STU-500, Wacom STU-530, Wacom DTU-1031, Samsung Galaxy Note 10.1 tablet and Samsung Ativ 7 tablet. All the devices capture, for each sample, the X and Y position, the time between samples and the pressure applied with the stylus. Once the tools are implemented, a small study is carried out to assess their potential, using a database with both genuine and forged signatures from all of the devices mentioned above. For this experimental part of the project, the ATVS e-BioFirma database and the Biosecure DS2 database are used.
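The dynamic features the tool exposes (velocity, acceleration, duration, pressure) are derived directly from the sampled trajectories described above. A minimal Python sketch of that kind of computation, assuming each signature is available as arrays of x, y, pressure and timestamps (the original tools are implemented in Matlab; all names and details here are illustrative):

```python
import numpy as np

def dynamic_features(x, y, p, t):
    """Derive basic dynamic features from one sampled signature.

    x, y : pen coordinates per sample
    p    : pressure per sample (0 for pen-up samples)
    t    : timestamps per sample, in seconds
    """
    # Instantaneous velocity from finite differences of the trajectory.
    vx = np.gradient(x, t)
    vy = np.gradient(y, t)
    v = np.hypot(vx, vy)

    # Acceleration as the derivative of the speed profile.
    a = np.gradient(v, t)

    return {
        "duration": t[-1] - t[0],
        "mean_velocity": v.mean(),
        "mean_acceleration": np.abs(a).mean(),
        "pen_down_ratio": np.mean(p > 0),  # fraction of samples with the stylus on the surface
        "mean_pressure": p[p > 0].mean() if np.any(p > 0) else 0.0,
    }
```

Feature values of this kind are what the scoring stage would compare against the genuine and forged population statistics mentioned in the abstract.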

    e-BioSign Tool: Towards Scientific Assessment of Dynamic Signatures under Forensic Conditions

    Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. R. Vera-Rodriguez, J. Fierrez, J. Ortega-Garcia, A. Acien and R. Tolosana, "e-BioSign tool: Towards scientific assessment of dynamic signatures under forensic conditions," 2015 IEEE 7th International Conference on Biometrics Theory, Applications and Systems (BTAS), Arlington, VA, 2015, pp. 1-6. doi: 10.1109/BTAS.2015.7358756

This paper presents a new tool specifically designed to carry out dynamic signature forensic analysis and give scientific support to forensic handwriting examiners (FHEs). Traditionally, FHEs have performed forensic analysis of paper-based signatures for court cases, but with the rapid evolution of technology they are increasingly asked to carry out analyses based on signatures acquired by digitizing tablets. In some cases, the option followed has been to obtain a paper impression of these signatures and carry out a traditional analysis, but this approach has many deficiencies: the low spatial resolution of some devices compared to original off-line signatures, and the fact that the dynamic information, which the biometric community has shown to be very discriminative, is lost and not taken into account at all. The tool we present in this paper allows FHEs to carry out a forensic analysis taking into account both the traditional off-line information normally used in paper-based signature analysis and the dynamic information of the signatures. Additionally, the tool incorporates two important functionalities: the first is the provision of statistical support to the analysis by including population statistics for genuine and forged signatures for some selected features, and the second is the incorporation of an automatic dynamic signature matcher, from which a likelihood ratio (LR) can be obtained from the matching comparison between the known and questioned signatures under analysis.

This work was supported in part by the Project Bio-Shield (TEC2012-34881), in part by the Cecabank e-BioFirma Contract, in part by the BEAT Project (FP7-SEC-284989) and in part by Catedra UAM-Telefonica.
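The likelihood-ratio functionality mentioned in the abstract weighs the comparison score of the questioned-vs-known pair against score populations for genuine and forged signatures. A minimal Python sketch of that idea, assuming simple Gaussian fits to previously collected population scores (the paper's actual matcher and score model are not specified here; all values are illustrative):

```python
import numpy as np
from scipy.stats import norm

def likelihood_ratio(score, genuine_scores, forged_scores):
    """LR = P(score | genuine) / P(score | forged), with simple Gaussian fits."""
    g = norm(loc=np.mean(genuine_scores), scale=np.std(genuine_scores))
    f = norm(loc=np.mean(forged_scores), scale=np.std(forged_scores))
    return g.pdf(score) / f.pdf(score)

# Example: a matcher score of 0.78 between the questioned and known signature,
# evaluated against hypothetical population scores.
lr = likelihood_ratio(0.78,
                      genuine_scores=[0.81, 0.74, 0.88, 0.79],
                      forged_scores=[0.35, 0.42, 0.28, 0.50])
print(f"LR = {lr:.2f}")  # LR > 1 supports the same-writer hypothesis
```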

    BeCAPTCHA: Behavioral bot detection using touchscreen and mobile sensors benchmarked on HuMIdb

    In this paper we study the suitability of a new generation of CAPTCHA methods based on smartphone interactions. The heterogeneous flow of data generated during the interaction with smartphones can be used to model human behavior and improve bot detection algorithms. For this, we propose BeCAPTCHA, a CAPTCHA method based on the analysis of the touchscreen information obtained during a single drag-and-drop task in combination with accelerometer data. The goal of BeCAPTCHA is to determine whether the drag-and-drop task was performed by a human or a bot. We evaluate the method by generating fake samples synthesized with Generative Adversarial Neural Networks and handcrafted methods. Our results suggest the potential of mobile sensors to characterize human behavior and to develop a new generation of CAPTCHAs. The experiments are evaluated with HuMIdb (Human Mobile Interaction database), a novel multimodal mobile database that comprises 14 mobile sensors acquired from 600 users. HuMIdb is freely available to the research community.

This work has been supported by projects: PRIMA, Spain (H2020-MSCA-ITN-2019-860315), TRESPASS-ETN, Spain (H2020-MSCA-ITN-2019-860813), BIBECA RTI2018-101248-B-I00 (MINECO/FEDER), and BioGuard, Spain (Ayudas Fundación BBVA a Equipos de Investigación Científica 2017). Spanish Patent Application P20203006.
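The core decision in BeCAPTCHA is a binary human-vs-bot classification of a single drag-and-drop gesture plus concurrent sensor data. A rough Python sketch of such a pipeline, using a few hand-picked trajectory statistics and a generic off-the-shelf classifier (the feature set, the classifier and all names below are assumptions for illustration; they are not the features or models used in the paper):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def gesture_features(xy, t, acc):
    """Summarize one drag-and-drop gesture.

    xy  : (n, 2) touch coordinates
    t   : (n,) timestamps in seconds
    acc : (m, 3) accelerometer samples recorded during the gesture
    """
    step = np.diff(xy, axis=0)
    speed = np.linalg.norm(step, axis=1) / np.diff(t)
    path_len = np.sum(np.linalg.norm(step, axis=1)) + 1e-9
    return np.array([
        speed.mean(), speed.std(),                      # humans show irregular speed profiles
        np.linalg.norm(xy[-1] - xy[0]) / path_len,      # straightness of the trajectory
        acc.std(axis=0).mean(),                         # hand micro-movements in the accelerometer
    ])

# Hypothetical training data: one feature vector per gesture, label 1 = human, 0 = bot.
# X_train = np.stack([gesture_features(*g) for g in gestures]); y_train = labels
# clf = RandomForestClassifier(n_estimators=100).fit(X_train, y_train)
# is_human = clf.predict([gesture_features(xy, t, acc)])[0]
```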

    TypeNet: Scaling up keystroke biometrics

    © 2020 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.

We study the suitability of keystroke dynamics to authenticate 100K users typing free text. For this, we first analyze to what extent our method, based on a Siamese Recurrent Neural Network (RNN), is able to authenticate users when the amount of data per user is scarce, a common scenario in free-text keystroke authentication. With 1K users for testing the network, a population size comparable to previous works, TypeNet obtains an equal error rate of 4.8% using only 5 enrollment sequences and 1 test sequence per user, with 50 keystrokes per sequence. Using the same amount of data per user, as the number of test users is scaled up to 100K, the performance decays by less than 5% relative to the 1K case, demonstrating the potential of TypeNet to scale well to large numbers of users. Our experiments are conducted with the Aalto University keystroke database. To the best of our knowledge, this is the largest free-text keystroke database captured, with more than 136M keystrokes from 168K users.

This work has been supported by projects: PRIMA (MSCA-ITN-2019-860315), TRESPASS (MSCA-ITN-2019-860813), BIBECA (RTI2018-101248-B-I00 MINECO), and by edBB (UAM). A. Acien is supported by an FPI fellowship from the Spanish MINECO.
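The Siamese comparison described above maps each keystroke sequence to a fixed-size embedding and authenticates by the distance between embeddings. A minimal PyTorch sketch of that idea (layer sizes, feature dimensions and names are illustrative assumptions, not the TypeNet architecture):

```python
import torch
import torch.nn as nn

class KeystrokeEmbedder(nn.Module):
    """Maps a sequence of per-keystroke timing features to a fixed-size embedding."""
    def __init__(self, in_features=5, hidden=128, emb=64):
        super().__init__()
        self.lstm = nn.LSTM(in_features, hidden, num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden, emb)

    def forward(self, x):                  # x: (batch, 50 keystrokes, features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])       # embedding taken from the last time step

model = KeystrokeEmbedder()
enrol = model(torch.randn(1, 50, 5))       # enrollment sequence (dummy data)
test  = model(torch.randn(1, 50, 5))       # test sequence (dummy data)
distance = torch.norm(enrol - test, dim=1) # small distance -> likely the same user
```

In a Siamese setup both branches share the same weights, so a single module applied twice, as above, is enough; training would pull genuine pairs together and push impostor pairs apart.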

    Mobile behavioral biometrics for passive authentication

    Current mobile user authentication systems based on PIN codes, fingerprint, and face recognition have several shortcomings. Such limitations have been addressed in the literature by exploring the feasibility of passive authentication on mobile devices through behavioral biometrics. In this line of research, this work carries out a comparative analysis of unimodal and multimodal behavioral biometric traits acquired while the subjects perform different activities on the phone such as typing, scrolling, drawing a number, and tapping on the screen, considering the touchscreen and the simultaneous background sensor data (accelerometer, gravity sensor, gyroscope, linear accelerometer, and magnetometer). Our experiments are performed over HuMIdb, one of the largest and most comprehensive freely available mobile user interaction databases to date. A separate Recurrent Neural Network (RNN) with triplet loss is implemented for each single modality. Then, the weighted fusion of the different modalities is carried out at score level. In our experiments, the most discriminative background sensor is the magnetometer, whereas among touch tasks the best results are achieved with keystroke in a fixed-text scenario. In all cases, the fusion of modalities is very beneficial, leading to Equal Error Rates (EER) ranging from 4% to 9% depending on the modality combination in a 3-second interval.

This project has received funding from the European Union's Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement no. 860315, and from Orange Labs. R. Tolosana and R. Vera-Rodriguez are also supported by INTER-ACTION (PID2021-126521OB-I00 MICINN/FEDER).
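The weighted score-level fusion mentioned above can be sketched in a few lines: each modality yields its own similarity score, and a normalized weighted sum combines them. The modalities, scores and weights below are illustrative, not values from the paper:

```python
import numpy as np

def fuse_scores(scores, weights):
    """Weighted score-level fusion of per-modality comparison scores.

    scores  : dict of modality -> similarity score in [0, 1]
    weights : dict of modality -> fusion weight (normalized below)
    """
    w = np.array([weights[m] for m in scores])
    s = np.array([scores[m] for m in scores])
    return float(np.dot(w, s) / w.sum())

# Hypothetical per-modality scores for one 3-second interval of interaction.
fused = fuse_scores(
    scores={"keystroke": 0.82, "touch": 0.64, "magnetometer": 0.71},
    weights={"keystroke": 0.5, "touch": 0.2, "magnetometer": 0.3},
)
print(f"fused score = {fused:.2f}")  # compared against a threshold to accept or reject
```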